On the similarity of the entropy power inequality and the Brunn-Minkowski inequality

Authors

  • Max H. M. Costa
  • Thomas M. Cover
Abstract

Abstract—The entropy power inequality states that the effective variance (entropy power) of the sum of two independent random variables is greater than the sum of their effective variances. The Brunn-Minkowski inequality states that the effective radius of the set sum of two sets is greater than the sum of their effective radii. Both of these inequalities are recast in a form that enhances their similarity. In spite of this similarity, there is as yet no common proof of the inequalities. Nevertheless, their intriguing similarity suggests that new results relating to entropies may be found from known results in geometry, and vice versa. Two applications of this reasoning are presented. First, an isoperimetric inequality for entropy is proved, showing that the spherical normal distribution minimizes the trace of the Fisher information matrix given an entropy constraint, just as a sphere minimizes the surface area given a volume constraint. Second, a theorem involving the effective radii of growing convex sets is proved.

Let a random variable X have a probability density function f(x), x \in \mathbb{R}. Its (differential) entropy H(X) is defined as

H(X) = -\int f(x) \ln f(x) \, dx.

Shannon's entropy power inequality [1] states that, for X and Y independent random variables having density functions,

e^{2H(X+Y)} \ge e^{2H(X)} + e^{2H(Y)}.   (1)

We wish to recast this inequality. First we observe that a normal random variable Z \sim \phi(z) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-z^2/2\sigma^2} with variance \sigma^2 has entropy

H(Z) = -\int \phi \ln \phi = \tfrac{1}{2} \ln 2\pi e \sigma^2.   (2)

By inverting, we see that if Z is normal with entropy H(Z), then its variance is

\sigma_Z^2 = \frac{1}{2\pi e}\, e^{2H(Z)}.   (3)

Thus the entropy power inequality is an inequality between effective variances, where the effective variance (entropy power) is simply the variance of the normal random variable with the same entropy. The preceding equations allow the entropy power inequality and the Brunn-Minkowski inequality to be rewritten in equivalent form; for the entropy power inequality this reads

H(X+Y) \ge H(X'+Y'),   (4)

where X' and Y' are independent normal random variables with corresponding entropies H(X') = H(X) and H(Y') = H(Y). Verification of this restatement follows from the use of (1) to show that

e^{2H(X+Y)} \ge e^{2H(X)} + e^{2H(Y)} = 2\pi e\,(\sigma_{X'}^2 + \sigma_{Y'}^2) = 2\pi e\, \sigma_{X'+Y'}^2 = e^{2H(X'+Y')}.
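As a quick numerical illustration of this effective-variance reading of (1), the sketch below (not from the correspondence; the choice of X ~ Uniform(0,1), Y ~ N(0, 0.25), and the integration grid are assumptions made here for illustration) computes the entropy powers on both sides of the inequality, using the closed-form entropies of the uniform and normal densities and numerical integration for H(X+Y).

```python
# A minimal numerical sketch of the entropy power inequality (1): not from the
# paper; the specific distributions and grid below are illustrative choices.
# Entropy power (effective variance) of Z: N(Z) = exp(2*H(Z)) / (2*pi*e),
# i.e. the variance of the normal with the same entropy, as in (3).
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

sigma = 0.5                     # assumed: Y ~ N(0, sigma^2)

# Closed-form differential entropies (in nats).
H_X = 0.0                                        # X ~ Uniform(0,1): H = ln(1) = 0
H_Y = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # normal entropy, equation (2)

# Density of S = X + Y by convolution: f(s) = Phi(s/sigma) - Phi((s-1)/sigma).
s = np.linspace(-8.0, 9.0, 200_001)
f = norm.cdf(s / sigma) - norm.cdf((s - 1.0) / sigma)

# H(S) = -integral of f*ln(f) ds, with 0*ln(0) treated as 0.
H_S = trapezoid(np.where(f > 0, -f * np.log(f), 0.0), s)

def entropy_power(H):
    """Variance of the normal random variable with entropy H, equation (3)."""
    return np.exp(2.0 * H) / (2.0 * np.pi * np.e)

print("N(X) + N(Y) =", entropy_power(H_X) + entropy_power(H_Y))   # ~0.309
print("N(X + Y)    =", entropy_power(H_S))                        # larger, as (1) asserts
```

For two independent normal variables the two printed quantities would coincide: effective variances are then ordinary variances, and the variance of a sum of independent variables is the sum of the variances, which is the sense in which (4) holds with equality in the Gaussian case.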


Similar Articles

On the analogue of the concavity of entropy power in the Brunn-Minkowski theory

Elaborating on the similarity between the entropy power inequality and the Brunn-Minkowski inequality, Costa and Cover conjectured in "On the similarity of the entropy power inequality and the Brunn-Minkowski inequality" (IEEE Trans. Inform. Theory 30 (1984), no. 6, 837-839) the 1/n-concavity of the outer parallel volume of measurable sets as an analogue of the concavity of entropy power. We inve...

Volume difference inequalities for the projection and intersection bodies

In this paper, we introduce a new concept, the volume difference function of the projection and intersection bodies. Following this, we establish the Minkowski and Brunn-Minkowski inequalities for the volume difference function of the projection and intersection bodies.

Volumes of Restricted Minkowski Sums and the Free Analogue of the Entropy Power Inequality

In noncommutative probability theory independence can be based on free products instead of tensor products. This yields a highly noncommutative theory: free probability (for an introduction see [9]). The analogue of entropy in the free context was introduced by the second named author in [8]. Here we show that Shannon's entropy power inequality ([6],[1]) has an analogue for the free entropy χ(X...

Dimensional behaviour of entropy and information

We develop an information-theoretic perspective on some questions in convex geometry, providing for instance a new equipartition property for log-concave probability measures, some Gaussian comparison results for log-concave measures, an entropic formulation of the hyperplane conjecture, and a new reverse entropy power inequality for log-concave measures analogous to V. Milman’s reverse Brunn-M...



Journal:
  • IEEE Trans. Information Theory

Volume 30, Issue 6

Pages 837-839

Publication date: 1984